@itsarijitray commented Oct 8, 2025

Exposing batch size for the Kafka destination (a minimal sketch of how this could be surfaced follows the list below):

  • Tackle `max.request.size` limit breaches for the Kafka destination: a customer is hitting the max request size limit (ref), which was not encountered earlier because of stream mode. Reducing the batch size will unblock the customer for now.
  • Mitigate timeouts: timeouts have increased for certain customers (ref). Increasing the batch size might help reduce them in the short term.
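
For reviewers unfamiliar with how per-mapping batch sizing is usually surfaced, here is a minimal sketch of what exposing a configurable `batch_size` on the Kafka send action could look like, assuming the standard action-destinations field conventions. The field names, defaults, and descriptions below are illustrative assumptions, not the exact diff in this PR.

```ts
// Illustrative sketch only; field names and defaults are assumptions.
import type { ActionDefinition } from '@segment/actions-core'
import type { Settings } from '../generated-types'
import type { Payload } from './generated-types'

const action: ActionDefinition<Settings, Payload> = {
  title: 'Send',
  description: 'Send data to a Kafka topic.',
  fields: {
    // ...existing fields (topic, payload, headers, etc.)...
    enable_batching: {
      label: 'Enable Batching',
      description: 'When enabled, events are sent to Kafka in batches.',
      type: 'boolean',
      default: true
    },
    batch_size: {
      label: 'Batch Size',
      description:
        'Maximum number of events per batch sent to Kafka. Lower this if the ' +
        'producer breaches max.request.size; raise it cautiously to reduce ' +
        'per-request overhead and timeouts.',
      type: 'number',
      default: 10000
    }
  },
  perform: async (request, { settings, payload }) => {
    // single-event path
  },
  performBatch: async (request, { settings, payload }) => {
    // payload is an array capped at batch_size events by the framework
  }
}

export default action
```

The intent is that the framework caps each `performBatch` payload at `batch_size` events, so lowering the value keeps individual producer requests under `max.request.size`, while raising it trades larger requests for fewer round trips.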

Testing

Include any additional information about the testing you have completed to
ensure your changes behave as expected. For a speedy review, please check
any of the tasks you completed below during your testing.

  • Added unit tests for new functionality
  • Tested end-to-end using the local server
  • [If destination is already live] Tested for backward compatibility of destination. Note: New required fields are a breaking change.
  • [Segmenters] Tested in the staging environment
  • [Segmenters] [If applicable for this change] Tested for regression with Hadron.

Test instance: https://app.segment.build/arijit-dev/destinations/actions-kafka/sources/http_api/instances/6880c59b62bcbb1aabdd3887/actions

Testing with 10K as batch_size

Test window: https://segment.datadoghq.com/notebook/12973425/kafka-analyisis?range=1463694&start=1759986162381&live=false
[Screenshot: Datadog metrics for the test window, 2025-10-09 11:00 AM]

Mapping config:
[Screenshot: mapping config, 2025-10-09 10:57 AM]

Testing with 200 as batch_size
